Rethinking Label Refurbishment: Model Robustness under Label Noise

Authors

Abstract

A family of methods that generate soft labels by mixing hard labels with a certain distribution, namely label refurbishment, is widely used to train deep neural networks. However, some of these methods are still poorly understood in the presence of label noise. In this paper, we revisit four label refurbishment methods and reveal the strong connection between them. We find that they affect network models in different manners: two of them smooth the estimated posterior for regularization effects, while the other two force the model to produce high-confidence predictions. We conduct extensive experiments to evaluate the related methods and observe that both effects improve model generalization under label noise. Furthermore, we theoretically show that both lead to generalization guarantees on the clean distribution despite being trained on noisy labels.
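The abstract describes label refurbishment as generating soft labels by mixing a hard (one-hot) label with a certain distribution. A minimal sketch of that idea is below; the function name `refurbish_label`, the mixing weight `alpha`, and the choice of uniform mixing distribution are illustrative assumptions, not details taken from the paper:

```python
def refurbish_label(hard_label, num_classes, mix_dist, alpha=0.8):
    """Mix a one-hot hard label with a distribution to obtain a soft label.

    mix_dist could be the uniform distribution (label-smoothing style) or a
    model's own predicted posterior (bootstrapping style); both are members
    of the refurbishment family the abstract refers to.
    """
    # One-hot encode the (possibly noisy) hard label.
    one_hot = [1.0 if c == hard_label else 0.0 for c in range(num_classes)]
    # Convex combination of the hard label and the mixing distribution.
    return [alpha * h + (1.0 - alpha) * m for h, m in zip(one_hot, mix_dist)]

# Example: smooth class 2 of 5 toward the uniform distribution.
num_classes = 5
uniform = [1.0 / num_classes] * num_classes
soft = refurbish_label(2, num_classes, uniform, alpha=0.8)
# soft[2] == 0.84, every other entry == 0.04, and the entries sum to 1.
```

The resulting soft label stays a valid probability distribution for any `alpha` in [0, 1], which is what lets such targets plug directly into a cross-entropy loss.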


Similar Articles

On the Robustness of Decision Tree Learning Under Label Noise

In most practical problems of classifier learning, the training data suffers from label noise. Hence, it is important to understand how robust a learning algorithm is to such label noise. Experimentally, decision trees have been found to be more robust against label noise than SVMs and logistic regression. This paper presents some theoretical results to show that decision tree algorithms are...


Support Vector Machines Under Adversarial Label Noise

Battista Biggio [email protected] Dept. of Electrical and Electronic Engineering University of Cagliari Piazza d’Armi, 09123, Cagliari, Italy and Blaine Nelson [email protected] Dept. of Mathematics and Natural Sciences Eberhard-Karls-Universität Tübingen Sand 1, 72076, Tübingen, Germany and Pavel Laskov [email protected] Dept. of Mathematics and Natura...


Loss factorization, weakly supervised learning and label noise robustness

We prove that the empirical risk of most well-known loss functions factors into a linear term aggregating all labels with a term that is label-free, and can further be expressed by sums of the same loss. This holds true even for non-smooth, non-convex losses and in any RKHS. The first term is a (kernel) mean operator — the focal quantity of this work — which we characterize as the sufficient sta...


Robust Loss Functions under Label Noise for Deep Neural Networks

In many applications of classifier learning, training data suffers from label noise. Deep networks are learned using huge training data where the problem of noisy labels is particularly relevant. The current techniques proposed for learning deep networks under label noise focus on modifying the network architecture and on algorithms for estimating true labels from noisy labels. An alternate app...


Hyperparameter Selection under Localized Label Noise via Corrupt Validation

Existing research on label noise often focuses on simple uniform or class-conditional noise. However, in many real-world settings, label noise is somewhat systematic rather than completely random. Thus, we first propose a novel label noise model called Localized Label Noise (LLN) that corrupts the labels in small local regions and is significantly more general than either uniform or class-...



Journal

Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence

Year: 2023

ISSN: 2159-5399, 2374-3468

DOI: https://doi.org/10.1609/aaai.v37i12.26751